
Insect-inspired visual navigation on-board an autonomous robot: real-world routes encoded in a single layer network

Insect-inspired models of visual navigation that operate by scanning for familiar views of the world have been shown to be capable of robust route navigation in simulation. These familiarity-based navigation algorithms operate by training an artificial neural network (ANN) with views from a training route, so that it can then output a familiarity score for any new view. In this paper we show that such an algorithm – with all computation performed on a small low-power robot – is capable of delivering reliable direction information along real-world outdoor routes, even when scenes contain few local landmarks and have high levels of noise (from variable lighting and terrain). Indeed, routes can be precisely recapitulated, and we show that the required computation and storage do not increase with the number of training views. Thus the ANN provides a compact representation of the knowledge needed to traverse a route. In fact, rather than losing information, there are instances where the use of an ANN ameliorates the problems of suboptimal paths caused by tortuous training routes. Our results suggest the feasibility of familiarity-based navigation for long-range autonomous visual homing.
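
To make the approach concrete, the sketch below illustrates familiarity-based route navigation in Python. It assumes an Infomax-style single-layer network trained on flattened, downsampled route views and a software scan over candidate headings; the class names, hyperparameters, and demo data are illustrative assumptions, not the on-board implementation described in the paper.

```python
import numpy as np


class FamiliarityNet:
    """Single-layer familiarity network (illustrative Infomax-style sketch)."""

    def __init__(self, n_inputs, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        # One fixed-size weight matrix over the flattened view, so storage
        # does not grow with the number of training views.
        self.W = rng.normal(0.0, 1.0 / np.sqrt(n_inputs), (n_inputs, n_inputs))
        self.lr = lr

    @staticmethod
    def _prepare(view):
        # Flatten, zero-mean and unit-normalise a view so updates stay stable.
        x = np.asarray(view, dtype=float).reshape(-1, 1)
        x -= x.mean()
        norm = np.linalg.norm(x)
        return x / norm if norm > 0 else x

    def train(self, view):
        """One Infomax-style weight update for a single training-route view."""
        x = self._prepare(view)
        u = self.W @ x
        y = np.tanh(u)
        self.W += (self.lr / x.size) * (self.W - (y + u) @ (u.T @ self.W))

    def novelty(self, view):
        """Unfamiliarity score; lower means the view looks more like training."""
        x = self._prepare(view)
        return float(np.abs(self.W @ x).sum())


def best_heading(net, views_by_heading):
    """Pick the candidate heading whose view the network finds most familiar.

    `views_by_heading` is an assumed dict mapping heading (degrees) to a view,
    e.g. produced by rotating a panoramic image in software.
    """
    return min(views_by_heading, key=lambda h: net.novelty(views_by_heading[h]))


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    net = FamiliarityNet(n_inputs=90)          # e.g. a 90-pixel downsampled view
    route_views = rng.random((50, 90))         # stand-in for real route images
    for _ in range(10):                        # a few passes over the route
        for v in route_views:
            net.train(v)
    # Candidates: a lightly perturbed training view against two unrelated views.
    candidates = {0: route_views[10] + 0.01 * rng.random(90),
                  90: rng.random(90),
                  180: rng.random(90)}
    print("chosen heading:", best_heading(net, candidates), "degrees")
```

In this sketch the network weights are the only route memory: training touches each view once per pass without storing it, and heading recovery is just a scan for the minimum novelty score, which is in keeping with the abstract's point that computation and storage do not grow with the number of training views.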